

Humans risk being overrun by artificial superintelligence in 30 years

#artificialintelligence

A MACHINE with human-level intelligence could be built within the next 30 years and could represent a threat to life on Earth, some experts believe. AI researchers and technology executives such as Elon Musk are openly concerned about human extinction caused by machines. The Law of Accelerating Returns, a concept popularized by futurist Ray Kurzweil, holds that the rate of technological improvement accelerates exponentially: as technology gets more advanced, society and industry are better equipped to improve it faster and more drastically. "With more powerful computers and related technology, we have the tools and the knowledge to design yet more powerful computers, and to do so more quickly," Kurzweil wrote in his famous 2001 essay.


Humans risk being unable to control artificial intelligence, scientists fear

#artificialintelligence

Scientists have warned that humanity risks losing control of artificial intelligence if it keeps developing. AI software is becoming more common, with companies such as Amazon trialling autonomous vehicles. Experts recently made a major breakthrough with a revolutionary new AI system that never stops learning. But as the technology develops, an international group of researchers has warned of the increasing dangers of autonomous software. In a study published in the Journal of Artificial Intelligence Research, author Manuel Cebrian said: "A super-intelligent machine that controls the world sounds like science fiction.